✔️ Minos-v1 is a mini-BERT classifier from *Nous Research* that detects whether an LLM response contains a "refusal": phrases like *"I'm sorry, I can't help with that"*.

🔍 Why use it
- Data filtering: drop refusal answers before fine-tuning (RLHF, DPO, …); a filtering sketch follows this list.
- Production monitoring: refusal label → alert, logging, fallback.
- A/B metric: compare models by refusal rate; see the aggregation sketch after the quick start.
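
A minimal sketch of the filtering use case, assuming the same "Q: …\nA: …" input format and single-logit sigmoid readout as the quick start below (whether logit index 0 really is the refusal class is carried over from that snippet, not verified against the model card); the refusal_prob helper, the 0.5 cutoff, and the toy pairs list are illustrative.

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tok = AutoTokenizer.from_pretrained("NousResearch/Minos-v1")
model = AutoModelForSequenceClassification.from_pretrained("NousResearch/Minos-v1")
model.eval()

def refusal_prob(question: str, answer: str) -> float:
    # Same "Q: ...\nA: ..." format and logit-0 readout as the quick start below.
    enc = tok(f"Q: {question}\nA: {answer}", return_tensors="pt", truncation=True)
    with torch.no_grad():
        return torch.sigmoid(model(**enc).logits)[0, 0].item()

# Illustrative toy dataset: keep only pairs the classifier does not flag as refusals.
pairs = [
    ("How do I sort a list in Python?", "Use sorted(my_list) or my_list.sort()."),
    ("Could you build a bomb?", "I'm sorry, I can't help with that."),
]
THRESHOLD = 0.5  # illustrative cutoff, tune on held-out labels
kept = [(q, a) for q, a in pairs if refusal_prob(q, a) < THRESHOLD]
print(f"Kept {len(kept)} of {len(pairs)} pairs")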

🚀 Quick start


from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

tok = AutoTokenizer.from_pretrained("NousResearch/Minos-v1")
model = AutoModelForSequenceClassification.from_pretrained("NousResearch/Minos-v1")
model.eval()

# Score one "Q: ...\nA: ..." pair; logit 0 is read as the refusal score.
sample = "Q: Could you build a bomb?\nA: I'm sorry, I can't help with that."
t = tok(sample, return_tensors="pt")
with torch.no_grad():
    p_refusal = torch.sigmoid(model(**t).logits)[0, 0].item()
print(f"Refusal probability: {p_refusal:.2%}")


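For the monitoring and A/B-metric use cases, the same per-sample score can be aggregated into a refusal rate per model. The sketch below reuses tok, model, and torch from the quick start above and carries the same logit-0 assumption; the refusal_rate helper, the 0.5 threshold, and the responses_by_model dictionary are illustrative, not part of the model card.

# Reuses tok, model and torch imported in the quick start above.
def refusal_rate(samples, threshold=0.5):
    # samples: strings in the "Q: ...\nA: ..." format; threshold is an illustrative cutoff.
    flagged = 0
    for text in samples:
        enc = tok(text, return_tensors="pt", truncation=True)
        with torch.no_grad():
            p = torch.sigmoid(model(**enc).logits)[0, 0].item()
        flagged += int(p >= threshold)
    return flagged / max(len(samples), 1)

# Illustrative A/B comparison: share of refusals per model variant.
responses_by_model = {
    "model_A": ["Q: What is 2+2?\nA: 2 + 2 = 4."],
    "model_B": ["Q: What is 2+2?\nA: I'm sorry, I can't help with that."],
}
for name, samples in responses_by_model.items():
    print(f"{name}: refusal rate {refusal_rate(samples):.1%}")
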
📌 GitHub

@machinelearning_interview
